Snap announces new camera, augmented reality experiences
Snap Inc. announced new camera and augmented reality experiences for developers, creators, and Snapchatters:
Lens Studio is a free desktop application designed for developers and artists to build and distribute AR Lenses on Snapchat. Lens Studio now features SnapML, which lets any developer bring their own machine learning models to power Lenses. Now, anyone can create their own Lenses with neural networks that they’ve trained, expanding the possibilities for Lenses that can transform the world. Snap has partnered with Wannaby, Prisma, CV2020, and several Official Lens Creators on their first SnapML creations.
In addition, Lens Studio now offers Face Landmarks and Face Expressions for improved facial tracking, new Hand Gesture templates, and an updated user interface to simplify navigation within the tool. Lens Studio is also releasing a foot tracking template powered by an ML model from Wannaby that lets developers easily create Lenses that interact with feet.
Visit lensstudio.snapchat.com to download and get started in Lens Studio.
Today, Snap is also previewing Local Lenses, which enable a persistent, shared AR world built right on top of your neighborhood.
When Snapchatters “press and hold” on the camera screen, relevant, helpful Lenses are unlocked based on what they see in front of them.
Today, Snap is introducing new Scan partners:
With PlantSnap, Snapchatters can identify 90% of all known plants and trees. In partnership with Dog Scanner, they can point the Snapchat camera at a dog to recognize nearly 400 breeds. An integration with Yuka later this year powers Nutrition Scanner, which provides a rating on the quality of ingredients in many packaged foods when Snapchatters scan an item’s label. Scan can also enable compelling experiences for brands.
Additionally, Snap is introducing Voice Scan, which offers Snapchatters Lens results based on voice commands. Powered by a partnership with SoundHound, Voice Scan lets Snapchatters press and hold on the camera screen and tell Snapchat what kind of Lens to display.